Probability Density Functions (PDFs): the continuous analogue of the PMFs (Probability Mass Functions) of discrete random variables
Discrete Random Variables have PMFs $p_X(x)$ satisfying $p_X(x) \geq 0$ and $\sum\limits_x p_X(x) = 1$
A random variable is continuous if its distribution can be described by a PDF $f_X(x)$ with $$P(a \leq X \leq b) = \int_a^b f_X(x)dx, \qquad f_X(x) \geq 0, \qquad \int_{-\infty}^{\infty}f_X(x)dx = 1$$
Units of PDF are probability per unit length: $P(x \leq X \leq x + \delta) \approx f_X(x) \cdot \delta$ for small $\delta$
Continuous Uniform Random Variable: rather than only the integers between $a$ and $b$ being possible (discrete case), any real number between $a$ and $b$ is possible: $f_X(x) = \frac{1}{b-a}$ for $a \leq x \leq b$, and $0$ otherwise
Expectation of Continuous Random Variables: interpreted as the average over a large number of independent repetitions of a probabilistic experiment $$E[X] = \int_{-\infty}^{\infty}xf_X(x)dx$$
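The expectation integral can be sketched numerically; a minimal midpoint-rule approximation, assuming an illustrative Uniform$(a, b)$ density (the endpoints $a = 2$, $b = 6$ are made-up values, not from the notes):

```python
# Approximating E[X] = ∫ x f_X(x) dx numerically for a Uniform(a, b)
# random variable, where f_X(x) = 1/(b - a) on [a, b].

def uniform_pdf(x, a, b):
    """PDF of a continuous Uniform(a, b) random variable."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def expectation(pdf, lo, hi, n=100_000):
    """Midpoint-rule approximation of E[X] = ∫ x f_X(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx  # midpoint of the i-th subinterval
        total += x * pdf(x, lo, hi) * dx
    return total

a, b = 2.0, 6.0
print(round(expectation(uniform_pdf, a, b), 4))  # close to (a + b)/2 = 4.0
```

The midpoint rule is exact here because $x f_X(x)$ is linear on $[a, b]$.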
Expected Value Rule for Functions of Continuous Random Variables:
$$E[g(X)] = \int_{-\infty}^{\infty}g(x)f_X(x)dx$$
Linearity of Expectations: $$E[aX + b] = aE[X] + b$$
Variance of Continuous Random Variables: $$var(X) = E[(X - E[X])^2] = \int_{-\infty}^{\infty}(x - E[X])^2f_X(x)dx$$
Standard Deviation of Continuous Random Variables:
$$\sigma_X = \sqrt{var(X)}$$
Variance Rules: $$var(aX + b) = a^2var(X), \qquad var(X) = E[X^2] - (E[X])^2$$
Mean and Variance of Uniform Continuous Random Variable: for $X$ uniform on $[a, b]$, $$E[X] = \frac{a + b}{2}, \qquad var(X) = \frac{(b - a)^2}{12}$$
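The uniform mean $(a+b)/2$ and variance $(b-a)^2/12$ can be checked by simulation; a minimal Monte Carlo sketch (the endpoints, seed, and sample size are arbitrary illustrative choices):

```python
# Monte Carlo check of the Uniform(a, b) mean and variance formulas.
import random

random.seed(0)
a, b = 1.0, 5.0
n = 200_000
samples = [random.uniform(a, b) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(abs(mean - (a + b) / 2) < 0.05)        # True: mean ≈ 3.0
print(abs(var - (b - a) ** 2 / 12) < 0.05)   # True: var ≈ 16/12 ≈ 1.333
```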
Exponential Random Variables: single parameter $\lambda > 0 \to$ generally models waiting time until an event occurs $$f_X(x) = \lambda e^{-\lambda x}\ \text{for}\ x \geq 0, \qquad E[X] = \frac{1}{\lambda}, \qquad var(X) = \frac{1}{\lambda^2}$$
Memorylessness of Exponential Random Variable: again analogous to the Geometric Discrete Random Variable $$P(X > t + s \mid X > t) = P(X > s)$$
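Memorylessness can be verified empirically: the fraction of samples exceeding $t + s$ among those already exceeding $t$ should match the unconditional fraction exceeding $s$. A sketch with illustrative values of $\lambda$, $t$, $s$:

```python
# Empirical check of memorylessness for an Exponential(λ) random variable:
# P(X > t + s | X > t) = P(X > s) = e^{-λs}.
import random

random.seed(1)
lam, t, s = 0.5, 1.0, 2.0
samples = [random.expovariate(lam) for _ in range(500_000)]

beyond_t = [x for x in samples if x > t]
p_cond = sum(x > t + s for x in beyond_t) / len(beyond_t)
p_s = sum(x > s for x in samples) / len(samples)

print(abs(p_cond - p_s) < 0.01)  # True: both ≈ e^{-λs} = e^{-1} ≈ 0.368
```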
Cumulative Distribution Functions (CDFs): unifying representation of both discrete and continuous random variables
$$F_X(x) = P(X \leq x)$$
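Since the CDF is defined for any random variable, an empirical CDF built from samples should converge to the closed form; a sketch comparing the exponential CDF $F_X(x) = 1 - e^{-\lambda x}$ against sample frequencies (the value of $\lambda$ and the evaluation point are illustrative):

```python
# Empirical CDF of an Exponential(λ) sample vs. the exact CDF 1 - e^{-λx}.
import math
import random

random.seed(2)
lam, x0 = 1.5, 1.0
samples = [random.expovariate(lam) for _ in range(200_000)]

empirical = sum(x <= x0 for x in samples) / len(samples)
exact = 1 - math.exp(-lam * x0)
print(abs(empirical - exact) < 0.01)  # True: both ≈ 1 - e^{-1.5} ≈ 0.777
```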
Normal or Gaussian Random Variable: key to probability (Central Limit Theorem to be discussed)
Standard Normal Form of Gaussian Random Variable: mean of $0$, variance of $1 \to$ simplest form $$N(0,1): f_X(x) = \frac{1}{\sqrt{2 \pi}} e^{\frac{-x^2}{2}}$$
General Form of Gaussian Random Variable: mean of $\mu$, variance of $\sigma^2$ $$N(\mu, \sigma^2): f_X(x) = \frac{1}{\sigma \sqrt{2 \pi}} e^{\frac{-(x - \mu)^2}{2 \sigma^2}}$$
Linear Functions of Normal Random Variables: normality is preserved when forming linear functions of other normal random variables (proven later)
Calculating Probabilities with Normal Random Variables: the normal CDF has no closed-form expression, so probabilities are computed by standardizing and looking up the standard normal CDF $\Phi(z)$
Standardizing a Random Variable (Mean Normalization): if $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0, 1)$, so $P(X \leq x) = \Phi\left(\frac{x - \mu}{\sigma}\right)$
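Standardization reduces any normal probability to the standard normal CDF $\Phi$, which can be evaluated via the error function, $\Phi(z) = \frac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$. A sketch with illustrative $\mu$, $\sigma$, and interval:

```python
# Computing P(a < X < b) for X ~ N(μ, σ²) by standardizing Z = (X - μ)/σ.
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 10.0, 2.0
a, b = 8.0, 12.0  # one standard deviation on either side of the mean

p = phi((b - mu) / sigma) - phi((a - mu) / sigma)
print(round(p, 4))  # 0.6827, the familiar "68%" rule
```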
Conditional PDFs: conditioning a continuous $X$ on an event of the form $\{X \in A\}$ with $P(X \in A) > 0$: $$f_{X | X \in A}(x) = \frac{f_X(x)}{P(X \in A)}\ \text{for}\ x \in A,\ \text{and}\ 0\ \text{otherwise}$$
Total Probability Theorem and Total Expectation Theorem for Continuous Random Variables (PDFs): for a partition $A_1, \ldots, A_n$ of the sample space, $$f_X(x) = \sum\limits_{i=1}^{n}P(A_i)f_{X|A_i}(x), \qquad E[X] = \sum\limits_{i=1}^{n}P(A_i)E[X|A_i]$$
Mixed Distributions $\mathbf{\to}$ both Discrete and Continuous Random Variables:
Joint Continuous Random Variables and Joint PDFs: $X$ and $Y$ are jointly continuous if there is a joint PDF $f_{X,Y}(x,y)$ with $$P((X, Y) \in B) = \iint_B f_{X,Y}(x,y)\,dx\,dy$$
Joint and Marginal PDFs: the marginal of $X$ is recovered by integrating out $y$: $$f_X(x) = \int_{-\infty}^{\infty}f_{X,Y}(x,y)dy$$
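Marginalization, $f_X(x) = \int f_{X,Y}(x,y)\,dy$, can be sketched by numerical integration. The joint density below, $f(x,y) = x + y$ on the unit square, is a made-up example (it is a valid PDF since it integrates to 1):

```python
# Recovering a marginal PDF from a joint PDF by numerical integration.

def joint_pdf(x, y):
    """Illustrative joint density f(x, y) = x + y on [0, 1] x [0, 1]."""
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x, n=10_000):
    """Midpoint-rule approximation of f_X(x) = ∫₀¹ f_{X,Y}(x, y) dy."""
    dy = 1.0 / n
    return sum(joint_pdf(x, (j + 0.5) * dy) * dy for j in range(n))

# Analytically, f_X(x) = ∫₀¹ (x + y) dy = x + 1/2.
print(round(marginal_x(0.3), 4))  # 0.8
```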
More than 2 Random Variables for Joint PDFs:
Continuous PDF Case $\mathbf{\to}$ Functions of More than 2 RVs, Expected Value Rule, Linearity of Expectations:
Joint CDFs:
Conditional PDFs when Conditioning on Another Random Variable: $$f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}\ \text{for}\ f_Y(y) > 0$$
Total Probability Theorem, Conditional Expectation, and Total Expectation Theorem for Continuous PDFs:
Independence of Continuous RVs and PDFs: $X$ and $Y$ are independent if $$f_{X,Y}(x,y) = f_X(x)f_Y(y)\ \text{for all}\ x, y$$
Independent Normal Random Variables:
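One consequence worth checking by simulation: the sum of independent normals is normal, with means and (by independence) variances adding, $X + Y \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$. The parameters, seed, and sample size below are illustrative choices:

```python
# Simulation check: X ~ N(1, 4) and Y ~ N(2, 9) independent
# implies X + Y has mean 3 and variance 13.
import random

random.seed(3)
n = 300_000
sums = [random.gauss(1, 2) + random.gauss(2, 3) for _ in range(n)]

mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n

print(abs(mean - 3.0) < 0.05)  # True: mean ≈ μ₁ + μ₂ = 3
print(abs(var - 13.0) < 0.2)   # True: var ≈ σ₁² + σ₂² = 13
```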
Bayes Rule for Continuous Random Variables and PDFs: $$f_{X|Y}(x|y) = \frac{f_X(x)f_{Y|X}(y|x)}{f_Y(y)}\ \ \text{with}\ \ f_Y(y) = \int f_X(x')f_{Y|X}(y|x')dx'$$
Third Variation on Bayes Rule: One Discrete, One Continuous Random Variable (Mixed RVs):
Bayes Rule Example 1: Discrete Unknown, Continuous Measurement: from the summary slide above: $$p_{K|Y}(k|y) = \frac{p_K(k)f_{Y|K}(y|k)}{f_Y(y)}\ \ \text{with}\ \ f_Y(y) = \sum\limits_{k'}p_K(k')f_{Y|K}(y|k')$$
Bayes Rule Example 2: Continuous Unknown, Discrete Measurement: from the summary slide above: $$f_{Y|K}(y|k) = \frac{f_Y(y)p_{K|Y}(k|y)}{p_K(k)}\ \ with\ \ p_K(k) = \int f_Y(y')p_{K|Y}(k|y')dy'$$
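A concrete sketch of the discrete-unknown / continuous-measurement formula: a binary $K$ with a uniform prior and $Y \mid K = k \sim N(k, \sigma^2)$. All parameter values (prior, $\sigma$, observed $y$) are illustrative, not from the notes:

```python
# Bayes' rule with discrete unknown K and continuous measurement Y:
# p_{K|Y}(k|y) = p_K(k) f_{Y|K}(y|k) / f_Y(y).
import math

def normal_pdf(y, mu, sigma):
    """PDF of N(mu, sigma^2) evaluated at y."""
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

prior = {0: 0.5, 1: 0.5}  # p_K(k), illustrative uniform prior
sigma = 1.0
y = 0.8                   # observed measurement (illustrative)

# f_Y(y) = Σ_k' p_K(k') f_{Y|K}(y | k')
f_y = sum(prior[k] * normal_pdf(y, k, sigma) for k in prior)

# p_{K|Y}(k | y) = p_K(k) f_{Y|K}(y | k) / f_Y(y)
posterior = {k: prior[k] * normal_pdf(y, k, sigma) / f_y for k in prior}

print(round(posterior[1], 3))  # ≈ 0.574: K = 1 is more likely, since y is closer to 1
```

Note that the posterior probabilities sum to 1 by construction, since $f_Y(y)$ is exactly the normalizing constant.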